Introduction to "Particle Metropolis-Hastings using gradient and Hessian information" by J. Dahlin, F. Lindsten, T. Schön

Author

  • Christophe Andrieu
Abstract

The authors investigate the use of Metropolis-adjusted Langevin algorithms (MALA) in the context of particle MCMC algorithms. The ability to use this type of update can lead to more efficient MCMC algorithms. The challenge in this context is that MALA and its more sophisticated versions require the evaluation of the gradient of the log-likelihood and/or its Hessian, which are not available analytically. A way around this consists of estimating these quantities numerically. This can be achieved efficiently by exploiting the particle filter output already used to compute the likelihood estimator required to implement particle MCMC. This idea was proposed originally in Doucet et al. (2011), and similar ideas have subsequently been investigated in Nemeth and Fearnhead (2014), Dahlin et al. (2013) and Dahlin et al. (2014). The authors' contribution here is to propose solutions to the fact that the estimated Hessian can be negative, together with an extensive numerical evaluation of the performance properties of the algorithm, which results in some guidelines concerning...
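As a concrete illustration of the type of update discussed in the review, here is a minimal MALA sketch. The toy Gaussian target, the exact gradient, and the step size are assumptions chosen for illustration; in the particle MCMC setting described above, the gradient would be replaced by a particle-filter-based estimate.

```python
import numpy as np

rng = np.random.default_rng(1)

def mala_step(theta, log_post, grad, step=0.1):
    """One Metropolis-adjusted Langevin step.

    `grad` is here the exact gradient of the log-posterior; in particle
    MCMC it would be replaced by an estimate extracted from the particle
    filter output, as discussed in the review.
    """
    def log_q(x, y):
        # log density (up to a constant) of proposing x from y
        m = y + 0.5 * step**2 * grad(y)
        return -0.5 * np.sum((x - m) ** 2) / step**2

    prop = theta + 0.5 * step**2 * grad(theta) + step * rng.normal(size=theta.shape)
    log_alpha = (log_post(prop) - log_post(theta)
                 + log_q(theta, prop) - log_q(prop, theta))
    return prop if np.log(rng.uniform()) < log_alpha else theta

# Toy target: standard Gaussian log-posterior with exact gradient.
log_post = lambda th: -0.5 * np.sum(th ** 2)
grad = lambda th: -th

theta = np.zeros(1)
chain = []
for _ in range(500):
    theta = mala_step(theta, log_post, grad)
    chain.append(theta[0])
```

The gradient shifts the proposal mean toward regions of higher posterior density, which is what makes Langevin-type updates more efficient than a plain random walk when the gradient information is reliable.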


Similar resources

Particle Metropolis-Hastings using gradient and Hessian information

Particle Metropolis-Hastings (PMH) allows for Bayesian parameter inference in nonlinear state space models by combining MCMC and particle filtering. The latter is used to estimate the intractable likelihood. In its original formulation, PMH makes use of a marginal MCMC proposal for the parameters, typically a Gaussian random walk. However, this can lead to a poor exploration of the parameter sp...

Full text
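The PMH construction summarized above (an intractable likelihood replaced by a particle filter estimate inside a Metropolis-Hastings acceptance step) can be illustrated on a toy linear Gaussian state space model. The model, parameter names, particle count, and flat prior below are assumptions made for this sketch, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_loglik(y, phi, sigma_v=1.0, sigma_e=1.0, N=200):
    """Bootstrap particle filter estimate of log p(y | phi) for the toy SSM
    x_t = phi * x_{t-1} + v_t,  y_t = x_t + e_t  (all noises Gaussian)."""
    x = rng.normal(size=N)
    ll = 0.0
    for yt in y:
        x = phi * x + sigma_v * rng.normal(size=N)        # propagate
        logw = (-0.5 * np.log(2 * np.pi * sigma_e**2)
                - 0.5 * ((yt - x) / sigma_e) ** 2)        # weight
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                        # accumulate estimate
        x = rng.choice(x, size=N, p=w / w.sum())          # multinomial resampling
    return ll

def pmh_step(phi, ll_phi, y, rw_scale=0.1):
    """One PMH iteration with a Gaussian random-walk proposal; a flat
    prior on phi is assumed, so only the likelihood estimates appear."""
    phi_new = phi + rw_scale * rng.normal()
    ll_new = pf_loglik(y, phi_new)
    if np.log(rng.uniform()) < ll_new - ll_phi:
        return phi_new, ll_new
    return phi, ll_phi

# Simulate data from the model and run a few PMH iterations.
x, y = 0.0, []
for _ in range(50):
    x = 0.5 * x + rng.normal()
    y.append(x + rng.normal())
y = np.array(y)

phi, ll = 0.2, pf_loglik(y, 0.2)
for _ in range(20):
    phi, ll = pmh_step(phi, ll, y)
```

The key point is that the particle estimate of the likelihood enters the acceptance ratio directly; by the pseudo-marginal argument, the resulting chain still targets the exact posterior as long as the estimate is unbiased.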

Quasi-Newton particle Metropolis-Hastings

Particle Metropolis-Hastings enables Bayesian parameter inference in general nonlinear state space models (SSMs). However, in many implementations a random walk proposal is used and this can result in poor mixing if not tuned correctly using tedious pilot runs. Therefore, we consider a new proposal inspired by quasi-Newton algorithms that may achieve better mixing with less tuning. An advantage...

Full text

Second-order particle MCMC for Bayesian parameter inference

We propose an improved proposal distribution in the Particle Metropolis-Hastings (PMH) algorithm for Bayesian parameter inference in nonlinear state space models. This proposal incorporates second-order information about the parameter posterior distribution, which can be extracted from the particle filter already used within the PMH algorithm. The added information makes the proposal scale-inva...

Full text
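One way to build such a second-order proposal, including a guard against Hessian estimates that are not negative definite (the issue raised in the review above), is sketched below. The eigenvalue mirroring and flooring used here is one common regularization and an assumption of this sketch, not necessarily the authors' exact construction.

```python
import numpy as np

rng = np.random.default_rng(2)

def newton_proposal(theta, grad, hess, step=1.0, floor=1e-6):
    """Draw from a Newton-type Gaussian proposal whose mean and covariance
    use the gradient and the (regularized) negated inverse Hessian of the
    log-posterior, making the move roughly scale-invariant."""
    H = -hess                                   # information-like matrix
    evals, evecs = np.linalg.eigh(H)
    evals = np.maximum(np.abs(evals), floor)    # mirror and floor eigenvalues
    H_inv = evecs @ np.diag(1.0 / evals) @ evecs.T
    mean = theta + 0.5 * step**2 * (H_inv @ grad)
    cov = step**2 * H_inv
    return rng.multivariate_normal(mean, cov)

theta = np.array([2.0, -1.0])
grad = -theta     # toy quadratic log-posterior: -0.5 * |theta|^2
ok = newton_proposal(theta, grad, -np.eye(2))             # well-behaved Hessian
bad = newton_proposal(theta, grad, np.diag([1.0, -1.0]))  # indefinite estimate
```

Because the proposal covariance is the inverse of the curvature, rescaling a parameter rescales the proposal accordingly, which is the scale-invariance property the abstract refers to.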

Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo

Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, decisions and predictions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state space models. There is no closed-form solution available for this problem, implying that we...

Full text

A Bayesian Matrix Model for Relational Data

Relational learning can be used to augment one data source with other correlated sources of information, to improve predictive accuracy. We frame a large class of relational learning problems as matrix factorization problems, and propose a hierarchical Bayesian model. Training our Bayesian model using random-walk Metropolis-Hastings is impractically slow, and so we develop a block Metropolis-Ha...

Full text


Journal:
  • Statistics and Computing

Volume 25, Issue -

Pages -

Publication date: 2015